Abstract: We present a position and orientation controller for a hybrid rigid-soft manipulator arm in which the soft arm is extruded from a two-degrees-of-freedom rigid link. Our approach learns the dynamics of the hybrid arm operating at 4 Hz and leverages the learned model to generate optimal trajectories, which serve as expert data for learning a control policy. We performed an extensive evaluation of the policy on a physical hybrid arm capable of jointly controlling rigid and soft actuation. We show that with a single policy, the arm can reach arbitrary poses in the workspace within 12.5 s, with 3.73 cm position error (<6% of overall arm length) and 17.78 deg orientation error, while operating at different control frequencies and controlling the end effector under different loads. Our results show significant improvements in control speed while effectively controlling both the position and orientation of the end effector, compared to previous quasistatic controllers for hybrid arms.
Free, publicly accessible full text available July 1, 2026.
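The pipeline the abstract describes — fit a dynamics model, use it to generate expert trajectories, then clone a policy from that expert data — can be sketched in a few lines. This is a hypothetical illustration, not the paper's method: the linear dynamics, one-step expert, dimensions, and all names are assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch: (1) fit a discrete-time dynamics model from
# state-action data, (2) derive "expert" actions that drive the model
# toward goal poses, (3) behavior-clone a policy from the expert data.
# Linear models and dimensions are illustrative assumptions only.

rng = np.random.default_rng(0)
STATE_DIM, ACTION_DIM = 6, 4   # assumed pose and actuation dimensions
DT = 0.25                      # 4 Hz control period, as in the abstract

# (1) Learn dynamics x' = A x + B u by least squares on rollouts
# generated from an unknown "true" system.
A_true = np.eye(STATE_DIM) + 0.01 * rng.standard_normal((STATE_DIM, STATE_DIM))
B_true = 0.1 * rng.standard_normal((STATE_DIM, ACTION_DIM))
X = rng.standard_normal((500, STATE_DIM))
U = rng.standard_normal((500, ACTION_DIM))
Xn = X @ A_true.T + U @ B_true.T
theta, *_ = np.linalg.lstsq(np.hstack([X, U]), Xn, rcond=None)
A_hat, B_hat = theta[:STATE_DIM].T, theta[STATE_DIM:].T

# (2) Expert: one-step least-squares tracking, choosing u that
# minimizes ||A x + B u - x_goal||.
def expert_action(x, x_goal):
    u, *_ = np.linalg.lstsq(B_hat, x_goal - A_hat @ x, rcond=None)
    return u

# (3) Behavior cloning: fit a linear policy u = K [x; x_goal].
goals = rng.standard_normal((500, STATE_DIM))
U_exp = np.array([expert_action(x, g) for x, g in zip(X, goals)])
feats = np.hstack([X, goals])
K, *_ = np.linalg.lstsq(feats, U_exp, rcond=None)

# Because the expert is linear in [x; x_goal], the cloned policy
# reproduces it on the training set up to numerical error.
err = np.abs(feats @ K - U_exp).max()
print(err < 1e-6)
```

In the paper the dynamics model and policy are learned for a hybrid rigid-soft system, so both would be nonlinear function approximators rather than the linear least-squares fits used in this toy version.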
This paper describes a system for visually guided autonomous navigation of under-canopy farm robots. Low-cost under-canopy robots can drive between crop rows beneath the plant canopy and accomplish tasks that are infeasible for over-the-canopy drones or larger agricultural equipment. However, autonomously navigating them under the canopy presents a number of challenges: unreliable GPS and LiDAR, high sensing cost, challenging farm terrain, clutter from leaves and weeds, and large variability in appearance over the season and across crop types. We address these challenges with a modular system that leverages machine learning for robust, generalizable perception from monocular RGB images captured by low-cost cameras, and model predictive control for accurate control on challenging terrain. Our system, CropFollow, autonomously drives 485 meters per intervention on average, outperforming a state-of-the-art LiDAR-based system (286 meters per intervention) in extensive field testing spanning over 25 km.
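The model-predictive-control component mentioned in the abstract can be illustrated with a minimal sketch: predict the robot's lateral offset and heading between the rows over a short horizon with a unicycle model, and pick the steering command with the lowest predicted tracking cost. All parameter values, names, and the sampling-based optimizer are assumptions for illustration, not CropFollow's actual controller.

```python
import numpy as np

# Hypothetical MPC sketch for keeping a robot centered between crop
# rows. State: lateral offset from the row centerline (m) and heading
# relative to the row (rad). A finite set of candidate steering rates
# is scored by rolling out a unicycle model over a short horizon.

V, DT, HORIZON = 0.5, 0.2, 10             # speed (m/s), step (s), steps
CANDIDATES = np.linspace(-0.5, 0.5, 21)   # steering rates (rad/s)

def rollout_cost(offset, heading, omega):
    """Accumulated cost of holding one steering rate over the horizon."""
    cost = 0.0
    for _ in range(HORIZON):
        heading += omega * DT
        offset += V * np.sin(heading) * DT
        # Penalize lateral offset, heading error, and control effort.
        cost += offset**2 + 0.1 * heading**2 + 0.01 * omega**2
    return cost

def mpc_step(offset, heading):
    """Return the candidate steering rate with the lowest predicted cost."""
    costs = [rollout_cost(offset, heading, w) for w in CANDIDATES]
    return CANDIDATES[int(np.argmin(costs))]

# A robot 10 cm left of the centerline should steer back to the right,
# and vice versa.
print(mpc_step(-0.10, 0.0) > 0)
print(mpc_step(0.10, 0.0) < 0)
```

In the real system the offset and heading would come from the learned monocular perception module, and the optimizer would re-solve at every control step, applying only the first command of each plan.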